AI2 Launches New Open-Source Model OLMoE: Efficient, Powerful, and Affordable!
AIbase · 2024-09-10 09:38:36 · 11.7k
The Allen Institute for Artificial Intelligence (AI2) has released OLMoE, an open-source large language model designed to deliver high performance at low cost. The model uses a sparse Mixture-of-Experts (MoE) architecture with 7 billion total parameters, but its routing mechanism activates only about 1 billion of them for each input token, keeping computation efficient. OLMoE ships in both a base and an instruction-tuned version, each supporting a 4096-token context window. Its training data is drawn from a broad range of sources, including Common Crawl, Dolma CC, and Wikipedia.
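To make the routing idea concrete, here is a minimal sketch of a top-k sparse MoE layer in PyTorch. It is illustrative only: the layer sizes, expert count, and top-k value are assumptions chosen for the example, not OLMoE's actual configuration. The mechanism, however, is the one the article describes: every expert's parameters exist in the model, yet each token only passes through the few experts its router selects.

```python
# Minimal sketch of top-k sparse MoE routing (illustrative sizes, not OLMoE's).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, d_model=512, d_hidden=1024, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores every expert for every token.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is a small feed-forward network; all of them exist in
        # the model, but only top_k run per token.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden),
                          nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):  # x: (n_tokens, d_model)
        scores = self.router(x)                          # (n_tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # keep top-k experts per token
        weights = F.softmax(weights, dim=-1)             # normalize over the chosen experts
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                            # tokens routed to expert e
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel():
                # Run only the routed tokens through this expert, weighted by
                # their (normalized) router score.
                out[token_ids] += weights[token_ids, slot, None] * expert(x[token_ids])
        return out
```

This is why a 7B-parameter MoE can run at roughly the cost of a ~1B dense model: the parameter count grows with the number of experts, but per-token compute grows only with top_k.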
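For readers who want to try the model, a typical Hugging Face quick-start would look like the sketch below. The repository id is an assumption (the article does not name one), so check AI2's official release for the actual checkpoint name.

```python
# Hypothetical quick-start via Hugging Face transformers; the repo id below
# is an assumption, not confirmed by the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0924"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Mixture-of-Experts models are", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0]))
```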